The L1 Convergence of Kernel Density Estimates

Author

  • T. J. WAGNER
Abstract

$f_n(x) = (1/(nh^d)) \sum_{i=1}^{n} K((x - X_i)/h)$, where the kernel $K$ is a bounded probability density on $R^d$ and $\{h_n\}$ is a sequence of positive numbers. We are concerned here with the conditions on $f$, $K$ and $\{h_n\}$ which insure the $L_1$ convergence of $f_n$ to $f$, namely, (1) $\int_{R^d} |f_n(x) - f(x)|\, dx \to 0$ in probability (or w.p.1). This concern is motivated by the observation (Scheffé (1947)) that ...
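As a concrete illustration of the estimator and of the $L_1$ criterion (1) above (a sketch only, not part of the original paper), the following Python snippet evaluates $f_n$ for a univariate standard-normal sample with a Gaussian kernel and numerically approximates $\int |f_n(x) - f(x)|\, dx$; the bandwidth choice $h_n = n^{-1/5}$ is merely an assumed sequence satisfying $h_n \to 0$ and $nh_n \to \infty$.

```python
import numpy as np
from scipy.stats import norm

def kde(x_grid, sample, h):
    """Kernel density estimate f_n(x) = (1/(n*h)) * sum_i K((x - X_i)/h)
    for d = 1, with a standard Gaussian kernel K."""
    n = len(sample)
    u = (x_grid[:, None] - sample[None, :]) / h         # (grid, n) array of (x - X_i)/h
    return norm.pdf(u).sum(axis=1) / (n * h)

rng = np.random.default_rng(0)
f = norm(0.0, 1.0)                                      # true density: standard normal
x = np.linspace(-6.0, 6.0, 801)
dx = x[1] - x[0]

for n in (100, 1000, 10000):
    sample = f.rvs(size=n, random_state=rng)
    h = n ** (-1 / 5)                                   # assumed bandwidth: h_n -> 0, n*h_n -> infinity
    fn = kde(x, sample, h)
    l1_error = np.abs(fn - f.pdf(x)).sum() * dx         # numerical approximation of (1)
    print(f"n = {n:6d}, h = {h:.3f}, L1 error ~ {l1_error:.4f}")
```

Under these assumptions the printed $L_1$ error should shrink as $n$ grows, which is exactly the convergence the paper studies.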


Similar Articles

A Berry-Esseen Type Bound for the Kernel Density Estimator of Length-Biased Data

Length-biased data are widely seen in applications. They arise mostly in epidemiological studies or in survival analysis in medical research. Here we aim to propose a Berry-Esseen type bound for the kernel density estimator of this kind of data. The rate of normal convergence in the proposed Berry-Esseen type theorem is shown to be $O(n^{-1/6})$, modulo a logarithmic term, as n tends to infinity...


The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel

One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), the geometric extrapolation usual kernel (GEUK), a bias reduction kernel (BRK) and a geometric extrapolation bias reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of the smoothness parameter...


Almost Sure Convergence of Kernel Bivariate Distribution Function Estimator under Negative Association

Let $\{X_n, n \ge 1\}$ be a strictly stationary sequence of negatively associated random variables with common distribution function F. In this paper, we consider the estimation of the two-dimensional distribution function of $(X_1, X_{k+1})$ for fixed $k \in N$ based on kernel-type estimators. We establish asymptotic normality and moment properties. From these we derive the optimal bandwidth...


The Equivalence of Weak, Strong and Complete Convergence in L1 for Kernel Density Estimates, by Luc Devroye

$f_n(x) = (nh^d)^{-1} \sum_{i=1}^{n} K((x - X_i)/h)$, where $h = h_n$ is a sequence of positive numbers, and $K$ is an absolutely integrable function with $\int K(x)\, dx = 1$. Let $J_n = \int |f_n(x) - f(x)|\, dx$. We show that when $\lim_n h = 0$ and $\lim_n nh^d = \infty$, then for every $\epsilon > 0$ there exist constants $r, n_0 > 0$ such that $P(J_n \ge \epsilon) \le \exp(-rn)$, $n \ge n_0$. Also, when $J_n \to 0$ in probability as $n \to \infty$ and $K$ is a density, then $\lim_n h$...


Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data

Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth is varied depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncation...
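To make the adaptive-bandwidth idea concrete, here is a minimal sketch of one common form of the k-nearest-neighbor kernel density estimate in one dimension, in which the fixed bandwidth $h$ is replaced by $R_k(x)$, the distance from $x$ to its $k$-th nearest sample point. The sketch assumes fully observed data and does not model the left-truncation adjustment addressed by the paper; the choice $k \approx \sqrt{n}$ is illustrative only.

```python
import numpy as np
from scipy.stats import norm

def knn_kde(x_grid, sample, k):
    """k-nearest-neighbor kernel density estimate (one common form, d = 1):
    f_n(x) = (1/(n * R_k(x))) * sum_i K((x - X_i)/R_k(x)),
    where R_k(x) is the distance from x to its k-th nearest sample point,
    so the effective bandwidth widens where the data are sparse."""
    n = len(sample)
    diffs = x_grid[:, None] - sample[None, :]           # (grid, n) array of x - X_i
    r_k = np.sort(np.abs(diffs), axis=1)[:, k - 1]      # k-th nearest-neighbor distance at each x
    return norm.pdf(diffs / r_k[:, None]).sum(axis=1) / (n * r_k)

rng = np.random.default_rng(1)
sample = rng.standard_normal(500)                       # fully observed (untruncated) toy data
x = np.linspace(-4.0, 4.0, 801)
fn = knn_kde(x, sample, k=int(np.sqrt(len(sample))))    # k ~ sqrt(n): illustrative choice only
```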



Journal:

Volume   Issue

Pages  -

Publication date: 1979